Active‐Set Newton Methods and Partial Smoothness
Abstract
Diverse optimization algorithms correctly identify, in finite time, intrinsic constraints that must be active at optimality. Analogous behavior extends beyond optimization to systems involving partly smooth operators, and in particular to variational inequalities over partly smooth sets. As in classical nonlinear programming, such active-set structure underlies the design of accelerated local algorithms of Newton type. We formalize this idea in broad generality as a simple linearization scheme for two intersecting manifolds.
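To make the linearization idea concrete, here is a small numerical sketch (my own illustration, not the scheme from the paper): a Newton iteration that linearizes two manifolds in R^2, a circle and a parabola, and converges to a point in their intersection.

```python
import numpy as np

# Illustrative sketch: Newton's method applied to the combined defining
# equations of two manifolds in R^2, so each step linearizes both manifolds.
# The circle and parabola below are assumptions chosen for the example.

def F(p):  # circle: x^2 + y^2 - 1 = 0
    x, y = p
    return np.array([x**2 + y**2 - 1.0])

def G(p):  # parabola: y - x^2 = 0
    x, y = p
    return np.array([y - x**2])

def jac(p):  # stacked Jacobian of (F, G)
    x, y = p
    return np.array([[2.0 * x, 2.0 * y],
                     [-2.0 * x, 1.0]])

p = np.array([1.0, 1.0])  # initial guess near the intersection
for _ in range(20):
    r = np.concatenate([F(p), G(p)])
    d = np.linalg.lstsq(jac(p), -r, rcond=None)[0]  # linearized step
    p = p + d
    if np.linalg.norm(r) < 1e-12:
        break

print(p)  # converges to approx (0.7862, 0.6180)
```

The least-squares solve makes the same sketch work when the stacked system is rectangular rather than square.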
Similar Articles
Smoothness Methods
Let ℓ ∈ −∞. It has long been known that every prime is Thompson [17, 17]. We show that there exists a Ω-holomorphic Clifford scalar. A useful survey of the subject can be found in [17, 14]. Next, this could shed important light on a conjecture of Pólya.
Partial Smoothness, Tilt Stability, and Generalized Hessians
We compare two recent variational-analytic approaches to second-order conditions and sensitivity analysis for nonsmooth optimization. We describe a broad setting where computing the generalized Hessian of Mordukhovich is easy. In this setting, the idea of tilt stability introduced by Poliquin and Rockafellar is equivalent to a classical smooth second-order condition.
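For orientation, the Poliquin–Rockafellar notion of tilt stability mentioned above is commonly stated as follows (a standard formulation, not quoted from this abstract): a point $\bar{x}$ is a tilt-stable local minimizer of $f$ if, for some $\delta > 0$, the perturbed-minimizer mapping

$$ M(v) \;=\; \operatorname*{argmin}_{\|x-\bar{x}\| \le \delta} \bigl\{ f(x) - \langle v, x\rangle \bigr\} $$

is single-valued and Lipschitz continuous for $v$ near $0$, with $M(0) = \bar{x}$.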
Quasi-Newton Methods for Nonconvex Constrained Multiobjective Optimization
Here, a quasi-Newton algorithm for constrained multiobjective optimization is proposed. Under suitable assumptions, global convergence of the algorithm is established.
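As a reminder of the basic quasi-Newton mechanics, here is a minimal single-objective BFGS loop with Armijo backtracking; this is a deliberate simplification for orientation only, since the paper's algorithm handles constrained multiobjective problems, which this sketch does not.

```python
import numpy as np

# Minimal BFGS sketch on an assumed quadratic test function; the objective,
# tolerances, and line-search constants are illustrative choices.

def f(x):  return (x[0] - 1.0)**2 + 10.0 * (x[1] + 2.0)**2
def df(x): return np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])

x = np.zeros(2)
H = np.eye(2)                      # inverse-Hessian approximation
g = df(x)
for _ in range(100):
    if np.linalg.norm(g) < 1e-8:
        break
    p = -H @ g                     # quasi-Newton direction
    t = 1.0                        # Armijo backtracking line search
    while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
        t *= 0.5
    s = t * p
    x_new = x + s
    g_new = df(x_new)
    yv = g_new - g
    if yv @ s > 1e-12:             # curvature condition keeps H positive definite
        rho = 1.0 / (yv @ s)
        I = np.eye(2)
        H = (I - rho * np.outer(s, yv)) @ H @ (I - rho * np.outer(yv, s)) \
            + rho * np.outer(s, s)
    x, g = x_new, g_new

print(x)  # near the minimizer (1, -2)
```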
Approximations and generalized Newton methods
We study local convergence of generalized Newton methods for both equations and inclusions by using known and new approximations and regularity properties at the solution. Including Kantorovich-type settings, our goal is statements about all (not only some) Newton sequences with appropriate initial points. Our basic tools are results of [31], [37], and [40], mainly about Newton maps and modifie...
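A tiny semismooth-Newton sketch may help fix ideas (an illustration of the general technique, not of the constructions in [31], [37], or [40]): solve a nonsmooth scalar equation by replacing the classical derivative with an element of the generalized derivative.

```python
# Semismooth Newton for F(x) = max(x, 0) + x - 1 = 0, whose root is x = 0.5.
# F is not differentiable at 0, so an element of the Clarke generalized
# derivative stands in for F'(x). Equation and tolerances are assumptions.

def F(x):
    return max(x, 0.0) + x - 1.0

def dF(x):
    return 2.0 if x > 0 else 1.0  # generalized-derivative selection

x = 2.0
for _ in range(50):
    if abs(F(x)) < 1e-12:
        break
    x -= F(x) / dF(x)

print(x)  # 0.5
```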
An Investigation of Newton-Sketch and Subsampled Newton Methods
The concepts of sketching and subsampling have recently received much attention in the optimization and statistics communities. In this paper, we study Newton-Sketch and Subsampled Newton (SSN) methods for the finite-sum optimization problem. We consider practical versions of the two methods in which the Newton equations are solved approximately using the conjugate gradient (CG) method or a stoc...
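The SSN-with-CG combination can be sketched in a few lines (a rough illustration under assumed data, sample sizes, and regularization; not the paper's exact algorithm): subsample the Hessian of a regularized logistic finite sum, keep the full gradient, and solve each Newton system with a hand-rolled conjugate gradient.

```python
import numpy as np

# Subsampled Newton (SSN) sketch for an L2-regularized logistic finite-sum
# problem. Data, subsample size, and lambda are illustrative assumptions.

rng = np.random.default_rng(0)
n, d, lam = 2000, 5, 0.1
X = rng.standard_normal((n, d))
y = (X @ rng.standard_normal(d) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad(w):                          # full gradient of the regularized loss
    return X.T @ (sigmoid(X @ w) - y) / n + lam * w

def hess_sub(w, idx):                 # Hessian estimated on a subsample
    p = sigmoid(X[idx] @ w)
    D = p * (1.0 - p)
    return (X[idx].T * D) @ X[idx] / len(idx) + lam * np.eye(d)

def cg(A, b, iters=100, tol=1e-12):   # conjugate gradient for A x = b
    x = np.zeros_like(b)
    r, p = b.copy(), b.copy()
    rs = r @ r
    for _ in range(iters):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if rs_new < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

w = np.zeros(d)
for _ in range(15):
    S = rng.choice(n, size=200, replace=False)   # Hessian subsample
    w += cg(hess_sub(w, S), -grad(w))            # approximate Newton step

print(np.linalg.norm(grad(w)))  # gradient norm is driven near zero
```

Keeping the full gradient while subsampling only the Hessian is what makes the iteration contract even though each Newton direction is inexact.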
Journal
Journal Title: Mathematics of Operations Research
Year: 2021
ISSN: 0364-765X, 1526-5471
DOI: https://doi.org/10.1287/moor.2020.1075